
    A new data assimilation procedure to develop a debris flow run-out model

    Abstract Parameter calibration is one of the most problematic phases of numerical modeling, since the choice of parameters affects the model's reliability with respect to the physical problems being studied. In some cases, laboratory tests or physical models evaluating model parameters cannot be completed and other strategies must be adopted; numerical models reproducing debris flow propagation are one of these. Since scale problems affect the reproduction of real debris flows in the laboratory, as well as the specific tests used to determine rheological parameters, calibration is usually carried out by subjectively comparing only a few quantities, such as the soil deposit heights calculated for some sections of the debris flow or the distance traveled by the flow, with the values measured in situ after an event has occurred. Since no automatic or objective procedure has yet been produced, this paper presents a numerical procedure based on the application of a statistical algorithm, which makes it possible to define, without ambiguity, the best parameter set. The procedure has been applied to a study case for which digital elevation models from both before and after an important event exist, meaning that a good database for applying the method was available. Its application has uncovered insights that help to better understand debris flows and related phenomena.
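    As a rough illustration of the kind of objective calibration the abstract describes, the sketch below performs a grid search over candidate parameter sets and keeps the one whose simulated deposit heights best match the observed ones. The run-out simulator (`my_runout_model`), the Voellmy-style parameter names, and the example heights are hypothetical placeholders, not taken from the paper; the statistical criterion here is a simple RMSE rather than the paper's specific algorithm.

```python
import itertools
import numpy as np

def rmse(simulated, observed):
    """Root-mean-square error between simulated and observed deposit heights."""
    return float(np.sqrt(np.mean((np.asarray(simulated) - np.asarray(observed)) ** 2)))

def calibrate(run_model, observed_heights, param_grid):
    """Test every parameter combination and keep the one whose simulated
    deposit heights best match the observed ones.

    run_model(params) -> array of deposit heights at the control sections
    observed_heights  -> heights derived from the pre/post-event DEMs
    param_grid        -> dict mapping parameter name to candidate values
    """
    names = list(param_grid)
    best_params, best_score = None, float("inf")
    for values in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = rmse(run_model(params), observed_heights)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical example: a friction/turbulence (Voellmy-type) parameter grid and
# deposit heights (m) observed at three control sections.
observed = [1.8, 2.4, 0.9]
grid = {"mu": [0.05, 0.1, 0.2], "xi": [200, 500, 1000]}
# best, err = calibrate(my_runout_model, observed, grid)   # my_runout_model is assumed
```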

    Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    Abstract. For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, named Quebrada de Humahuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and have reached the alluvial fan on a decadal basis. In view of proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently added to the UNESCO World Heritage List; the growing tourism industry may therefore lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase in the vulnerability of the elements exposed to risk. In this context, structural and non-structural mitigation measures not only have to be based on an understanding of the natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined which respect budget constraints and local practices. One viable solution would be to build a protective dam upstream of the fan apex together with an artificial channel, in order to divert the floodwaters into a gully that would then convey water and sediments into the Rio Grande some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, and have low environmental and visual impact, in order to ensure both the future conservation of the site and its safe use by inhabitants and tourists.

    Acceptability with general orderings

    We present a new approach to termination analysis of logic programs. The essence of the approach is that we make use of general orderings (instead of level mappings), as is done in transformational approaches to logic program termination analysis, but we apply these orderings directly to the logic program and not to the term-rewrite system obtained through some transformation. We define some variants of acceptability based on general orderings and show that they are equivalent to LD-termination. We develop a demand-driven, constraint-based approach to verify these acceptability variants. The advantage of the approach over standard acceptability is that in some cases where complex level mappings are needed, fairly simple orderings may be easily generated. The advantage over transformational approaches is that it avoids the transformation step altogether. Keywords: termination analysis, acceptability, orderings. Comment: to appear in "Computational Logic: From Logic Programming into the Future".
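    To make the underlying idea concrete, the toy sketch below checks one acceptability-style condition: for a recursive clause, the recursive body call must be strictly smaller than the clause head under a chosen ordering (here a simple term-size norm). The term encoding and the norm are illustrative assumptions; the paper's actual constraint-based, demand-driven procedure over general orderings is considerably richer.

```python
def term_size(term):
    """Size norm: count of symbols in a term, with terms encoded as nested
    tuples such as ('cons', 'X', 'Xs') and variables/constants as strings."""
    if isinstance(term, tuple):
        return 1 + sum(term_size(arg) for arg in term[1:])
    return 1

def decreases(head_args, call_args, ordering=term_size):
    """Acceptability-style check for one recursive clause: the recursive
    body call must be strictly smaller than the head under the ordering."""
    return sum(ordering(a) for a in call_args) < sum(ordering(a) for a in head_args)

# append([X|Xs], Ys, [X|Zs]) :- append(Xs, Ys, Zs).
head = [("cons", "X", "Xs"), "Ys", ("cons", "X", "Zs")]
call = ["Xs", "Ys", "Zs"]
print(decreases(head, call))   # True: the arguments strictly decrease in size
```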

    The s-semantics approach: theory and applications

    Abstract This paper is a general overview of an approach to the semantics of logic programs whose aim is to find notions of models which really capture the operational semantics, and are, therefore, useful for defining program equivalences and for semantics-based program analysis. The approach leads to the introduction of extended interpretations which are more expressive than Herbrand interpretations. The semantics in terms of extended interpretations can be obtained as a result of both an operational (top-down) and a fixpoint (bottom-up) construction. It can also be characterized from the model-theoretic viewpoint, by defining a set of extended models which contains standard Herbrand models. We discuss the original construction modeling computed answer substitutions, its compositional version, and various semantics modeling more concrete observables. We then show how the approach can be applied to several extensions of positive logic programs. We finally consider some applications, mainly in the area of semantics-based program transformation and analysis.
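    The abstract's mention of a fixpoint (bottom-up) construction can be illustrated with a minimal sketch: iterating an immediate-consequence operator from the empty interpretation until nothing new is derived. Note this toy works on ground (Datalog-style) rules only; the s-semantics proper uses extended, non-ground interpretations that model computed answer substitutions, which this sketch does not attempt.

```python
def tp(rules, interpretation):
    """One step of the immediate-consequence operator T_P: derive every head
    whose body atoms are all already in the current interpretation."""
    return {head for head, body in rules if all(b in interpretation for b in body)}

def least_fixpoint(rules):
    """Iterate T_P from the empty interpretation up to its least fixpoint."""
    current = set()
    while True:
        nxt = current | tp(rules, current)
        if nxt == current:
            return current
        current = nxt

# path/2 as the transitive closure of edge/2, written out as ground rules.
rules = [
    (("edge", 1, 2), []),
    (("edge", 2, 3), []),
    (("path", 1, 2), [("edge", 1, 2)]),
    (("path", 2, 3), [("edge", 2, 3)]),
    (("path", 1, 3), [("path", 1, 2), ("path", 2, 3)]),
]
print(sorted(least_fixpoint(rules)))
```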

    Correlation of X-Ray CT Measurements to Shear Strength in Pultruded Composite Materials

    Pultrusion is an emerging, economical manufacturing process for composite structures. In a pultrusion system, the composite tapes and fabrics are loaded onto a creel, and the materials are fed into a preform (or shaper), along with any fillers that may be needed. If the fiber is not yet preimpregnated with resin, it is run through a resin bath or resin is injected into the die the material is about to enter. The composite is pulled through the heated die and then cut from the system to produce either a fully or partially cured product. This handleable part is then placed in an autoclave for final cure. A number of variables go into the pultrusion process, including the type of fibers, the resin matrix material, the pull rate, and the cure temperature. Destructive testing, such as shear testing of small sections, is the normal method for assessing the quality of the pultruded product. During manufacture, this cannot be performed on the actual product to be used, but only on near-neighbor test coupons. This is time-consuming and costly, and part of the product is destroyed.
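    The correlation named in the title can be illustrated with a short sketch: given a nondestructive CT-derived metric and destructively measured shear strengths for matching coupons, compute their Pearson correlation and a least-squares calibration line. The void-fraction metric, the numbers, and the linear model below are invented placeholders for illustration only, not data or results from the paper.

```python
import numpy as np

# Hypothetical illustration only: CT-derived void fraction (%) for a set of
# pultruded coupons and shear strength (MPa) measured on near-neighbor coupons.
void_fraction = np.array([0.5, 1.2, 2.0, 2.8, 3.5, 4.1])
shear_strength = np.array([78.0, 74.5, 70.2, 66.8, 62.1, 59.4])

# Pearson correlation quantifies how well the nondestructive CT metric
# tracks the destructive shear test result.
r = np.corrcoef(void_fraction, shear_strength)[0, 1]

# A least-squares line gives a simple predictive calibration.
slope, intercept = np.polyfit(void_fraction, shear_strength, 1)
print(f"r = {r:.3f}, strength ~ {slope:.2f} * void% + {intercept:.2f}")
```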

    A generic framework for context-sensitive analysis of modular programs

    Context-sensitive analysis provides information which is potentially more accurate than that provided by context-free analysis. Such information can then be applied in order to validate or debug the program and/or to specialize it, obtaining important improvements. Unfortunately, context-sensitive analysis of modular programs poses important theoretical and practical problems. One solution, used in several proposals, is to resort to context-free analysis. Other proposals do address context-sensitive analysis, but are only applicable when the description domain used satisfies rather restrictive properties. In this paper, we argue that a general framework for context-sensitive analysis of modular programs, i.e., one that allows using all the domains which have proved useful in practice in the non-modular setting, is indeed feasible and very useful. Driven by our experience in the design and implementation of analysis and specialization techniques in the context of CiaoPP, the Ciao system preprocessor, in this paper we discuss a number of design goals for context-sensitive analysis of modular programs as well as the problems which arise in trying to meet these goals. We also provide a high-level description of a framework for analysis of modular programs which substantially meets these objectives. This framework is generic in that it can be instantiated in different ways in order to adapt to different contexts. Finally, the behavior of the different instantiations w.r.t. the design goals that motivate our work is also discussed.
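    A minimal way to picture what "context-sensitive" buys over a context-free summary is an answer table keyed by the abstract call pattern as well as the predicate, so the same predicate analyzed under different calling contexts keeps separate, more precise answers. The sketch below assumes per-predicate transfer functions and toy "ground"/"any" descriptions; it is not the CiaoPP framework itself, only an illustration of the memoization idea.

```python
class ModuleAnalyzer:
    """Minimal sketch of context-sensitive analysis with a global answer table:
    results are keyed by (module, predicate, abstract call pattern)."""

    def __init__(self, transfer_functions):
        # transfer_functions[(module, predicate)] maps an abstract call pattern
        # to an abstract answer pattern (the per-predicate analysis step).
        self.transfer = transfer_functions
        self.answers = {}   # (module, predicate, call_pattern) -> answer_pattern

    def analyze(self, module, predicate, call_pattern):
        key = (module, predicate, call_pattern)
        if key not in self.answers:          # reuse previously computed answers
            self.answers[key] = self.transfer[(module, predicate)](call_pattern)
        return self.answers[key]

# Two calling contexts for the same predicate yield two distinct answers.
funcs = {("lists", "append/3"): lambda call: "ground" if call == "ground-inputs" else "any"}
a = ModuleAnalyzer(funcs)
print(a.analyze("lists", "append/3", "ground-inputs"))  # -> "ground"
print(a.analyze("lists", "append/3", "free-inputs"))    # -> "any"
```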